Steepest descent method for quasiconvex minimization on Riemannian manifolds

Authors

  • E. A. Papa Quiroz
  • E. M. Quispe
  • Roberto Oliveira
Abstract

This paper extends the full convergence of the steepest descent algorithm, with a generalized Armijo search and a proximal regularization, to solve quasiconvex minimization problems defined on complete Riemannian manifolds. Previous convergence results are obtained as particular cases of our approach, and some examples in non-Euclidean spaces are given.
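
For illustration, the following is a minimal numerical sketch, not the authors' implementation, of steepest descent with a backtracking Armijo search on the unit sphere, one of the simplest complete Riemannian manifolds. The objective (a quadratic form), the Armijo constants beta and sigma, and the starting point are hypothetical choices made only for this example.

    # Minimal sketch: Riemannian steepest descent with Armijo backtracking
    # on the unit sphere S^{n-1}.  Hypothetical example, not the paper's code.
    import numpy as np

    def sphere_grad(grad_eucl, x):
        # Riemannian gradient: project the Euclidean gradient onto the
        # tangent space T_x = {v : <v, x> = 0} of the unit sphere at x.
        g = grad_eucl(x)
        return g - np.dot(g, x) * x

    def sphere_exp(x, v):
        # Exponential map on the sphere: move from x along the great
        # circle determined by the tangent vector v.
        nv = np.linalg.norm(v)
        if nv < 1e-16:
            return x
        return np.cos(nv) * x + np.sin(nv) * (v / nv)

    def steepest_descent_armijo(f, grad_eucl, x0, beta=0.5, sigma=1e-4,
                                max_iter=200, tol=1e-8):
        x = x0 / np.linalg.norm(x0)
        for _ in range(max_iter):
            g = sphere_grad(grad_eucl, x)
            gnorm2 = np.dot(g, g)
            if gnorm2 < tol ** 2:
                break
            # Armijo backtracking along the geodesic t -> exp_x(-t*g):
            # shrink t until the sufficient-decrease condition holds.
            t = 1.0
            while t > 1e-12 and f(sphere_exp(x, -t * g)) > f(x) - sigma * t * gnorm2:
                t *= beta
            x = sphere_exp(x, -t * g)
        return x

    # Hypothetical usage: minimize the quadratic form x^T A x on the sphere;
    # its minimizers are eigenvectors for the smallest eigenvalue of A.
    A = np.diag([3.0, 2.0, 1.0])
    x_min = steepest_descent_armijo(lambda x: x @ A @ x,
                                    lambda x: 2.0 * A @ x,
                                    np.array([1.0, 1.0, 1.0]))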

Similar articles

Steepest descent method on a Riemannian manifold: the convex case

In this paper we are interested in the asymptotic behavior of the trajectories of the famous steepest descent evolution equation on Riemannian manifolds, which reads ẋ(t) + grad φ(x(t)) = 0. It is shown how the convexity of the objective function φ helps in establishing the convergence, as time goes to infinity, of the trajectories towards points that minimize φ. Some numerical illustrations are ...
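
For reference, one standard way to pass from this continuous flow to a discrete steepest descent scheme (a sketch, not taken from the paper) is the explicit geodesic discretization with step sizes $h_k > 0$,

$$ x_{k+1} = \exp_{x_k}\!\left(-h_k\,\operatorname{grad}\varphi(x_k)\right), $$

which formally recovers the trajectory x(t) as the step sizes tend to zero.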

ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds

This paper presents a descent direction method for finding extrema of locally Lipschitz functions defined on Riemannian manifolds. To this end we define a set-valued mapping x → ∂εf(x), called the ε-subdifferential, which is an approximation of the Clarke subdifferential and which generalizes the Goldstein ε-subdifferential to the Riemannian setting. Using this notion we construct a steepest descent ...
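
For orientation, in the Euclidean setting the Goldstein ε-subdifferential mentioned above is usually defined as the closed convex hull of Clarke subdifferentials over an ε-ball; this standard definition is stated here as background, not as the paper's own construction:

$$ \partial_\varepsilon f(x) \;=\; \overline{\mathrm{conv}}\,\bigl\{\, \partial f(y) \;:\; \|y - x\| \le \varepsilon \,\bigr\}, $$

where $\partial f$ denotes the Clarke subdifferential; the abstract indicates that this notion is carried over to the Riemannian setting.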

Fitting Curves on Riemannian Manifolds Using Energy Minimization

Given data points p0, ..., pN on a Riemannian manifold M and time instants 0 = t0 < t1 < ... < tN = 1, we consider the problem of finding the curve γ on M that best approximates the data points at the given instants. In this work, γ is expressed as the curve that minimizes the weighted sum of a least-squares term penalizing the lack of fitting to the data points and a regularity term defin...
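
A typical form of such a penalized energy (the precise regularity term used in the paper may differ) is

$$ E(\gamma) \;=\; \frac{1}{2}\sum_{i=0}^{N} d^2\bigl(\gamma(t_i), p_i\bigr) \;+\; \frac{\lambda}{2}\int_0^1 \Bigl\|\tfrac{D^2\gamma}{dt^2}(t)\Bigr\|^2 dt, $$

where $d$ is the Riemannian distance on $M$, $D^2\gamma/dt^2$ is the covariant acceleration of $\gamma$, and $\lambda > 0$ balances regularity against fidelity to the data.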

Hybrid steepest-descent method with sequential and functional errors in Banach space

Let $X$ be a reflexive Banach space, $T:X\to X$ a nonexpansive mapping with $C=\mathrm{Fix}(T)\neq\emptyset$, and $F:X\to X$ a $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive mapping with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences ...
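
For context, a minimal sketch of the underlying recursion, assuming the classical error-free form of the hybrid steepest-descent method rather than the modified schemes with errors studied in the paper, is

$$ x_{n+1} = T x_n - \lambda_n F(T x_n), \qquad \lambda_n \downarrow 0, \quad \sum_n \lambda_n = \infty; $$

the sequential and functional errors described in the abstract would enter as perturbations of such a recursion.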

Proximal Point Methods for Functions Involving Lojasiewicz, Quasiconvex and Convex Properties on Hadamard Manifolds

This paper extends the full convergence of the proximal point method with Riemannian, Semi-Bregman and Bregman distances to solve minimization problems on Hadamard manifolds. For the unconstrained problem, under the assumptions that the optimal set is nonempty and the objective function is continuous and either quasiconvex or satisfies a generalized Lojasiewicz property, we prove the full conve...
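
For reference, with the Riemannian distance $d$ the proximal step on a Hadamard manifold $M$ is commonly written as

$$ x^{k+1} \in \operatorname*{arg\,min}_{x \in M} \Bigl\{ f(x) + \tfrac{\lambda_k}{2}\, d^{2}(x, x^{k}) \Bigr\}, \qquad \lambda_k > 0, $$

a standard formulation rather than a quotation from the paper; the Semi-Bregman and Bregman variants replace $d^{2}$ by the corresponding distance-like function.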

Journal title:

Volume   Issue

Pages  -

Publication date: 2006